
    Determining the Importance of Macro and Trace Dietary Minerals on Growth and Nutrient Retention in Juvenile Penaeus monodon

    Twelve minerals were screened to identify the dietary minerals most important for Penaeus monodon growth. The minerals selected were boron, calcium plus phosphorus (assessed in tandem at a 1:1 ratio), cobalt, copper, magnesium, manganese, potassium, selenium, sodium, strontium and zinc. Twelve purified casein/gelatin-based diets were formulated to contain minerals at two levels: below expected requirements, as supplied by the basal formulation (-), and above expected requirements, achieved by adding inorganic minerals (+). The two levels were allocated to dietary treatments in juvenile prawns according to a Plackett-Burman (PB) design. A two-level screening analysis was used in which the effect of each mineral at the - or + level across the twelve diets was compared, to identify the minerals most important for the culture performance of juvenile prawns. Calcium plus phosphorus (at a 1:1 ratio), magnesium, boron, manganese, selenium and zinc produced the greatest positive effects on weight gain, feed conversion efficiency, biomass gain and nutrient/energy retention. In particular, boron and manganese significantly increased the retention of key macronutrients and energy, including gross energy, crude protein and crude lipid. Our study demonstrates the importance of several macro and trace minerals in prawn diets and the pressing need to refine their requirements for P. monodon.
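
    As a rough illustration of the two-level screening analysis described above, the sketch below builds a common 12-run Plackett-Burman layout for eleven two-level factors (calcium plus phosphorus treated as a single factor) and estimates each mineral's main effect as the difference in mean response between its + and - diets. The response values and the use of NumPy are illustrative assumptions, not the study's actual data or analysis code.

```python
# Illustrative sketch of a two-level (Plackett-Burman style) screening analysis.
# The mineral names match the abstract, but the response values are invented
# for demonstration -- they are NOT the study's data.
import numpy as np

minerals = ["B", "Ca+P", "Co", "Cu", "Mg", "Mn", "K", "Se", "Na", "Sr", "Zn"]

# One common 12-run Plackett-Burman construction: cyclic shifts of a generator
# row (+1 = supplemented above requirement, -1 = basal/below requirement),
# plus a final row of all -1.
generator = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
design = np.array([np.roll(generator, i) for i in range(11)]
                  + [-np.ones(11, dtype=int)])

# Hypothetical weight-gain responses (g) for the twelve diets.
rng = np.random.default_rng(0)
true_effects = rng.normal(0, 0.5, size=11)            # made-up "true" effects
weight_gain = 10 + design @ true_effects + rng.normal(0, 0.2, size=12)

# Main effect of each mineral: mean response at (+) minus mean response at (-).
for j, mineral in enumerate(minerals):
    high = weight_gain[design[:, j] == +1].mean()
    low = weight_gain[design[:, j] == -1].mean()
    print(f"{mineral:>5}: estimated effect = {high - low:+.2f} g")
```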

    Mineral nutrition in penaeid shrimp

    This review summarises the current knowledge of mineral nutrition in penaeid shrimp. It examines how the aquatic environment and the shrimp lifecycle affect requirements, and the roles that minerals play in shrimp health. Methods of supplying minerals via either the water or the feed, and novel ways of supplementing minerals within feed, are discussed. The requirements for individual minerals are summarised, with recommended minimum levels of dietary inclusion for semi-intensive and intensive commercial shrimp culture presented where data permit. Estimates of dietary requirement remain broad for most minerals in the main shrimp production species (Penaeus vannamei, Penaeus monodon and Penaeus japonicus), and some essential minerals remain unstudied (Table 2 in Section 5.10). Mineral nutrition will become more important as intensification and diversification of production systems present new challenges to shrimp aquaculture.

    Increased respiratory viral detection and symptom burden among patients with primary antibody deficiency: results from the BIPAD study

    Background: Patients with primary antibody deficiency (PAD) are at increased risk of respiratory tract infections, but our understanding of their nature and consequences remains limited. Objective: To define the symptomatic and microbial burden of upper airway infection in adults with PAD relative to age-matched controls. Methods: Prospective 12-month observational study consisting of daily upper and lower airway symptom scores alongside fortnightly nasal swabs with molecular detection of 19 pathogen targets. Results: A total of 44 patients and 42 controls (including 34 household pairs) were recruited, providing more than 22,500 days of symptom scores and 1,496 nasal swabs. Swab and questionnaire compliance exceeded 70%. At enrollment, 64% of patients were receiving prophylactic antibiotics, and the prevalence of bronchiectasis was 34%. On average, patients with PAD experienced symptomatic respiratory exacerbations every 6 days, compared with every 6 weeks for controls, associated with significant impairment of respiratory-specific quality-of-life scores. Viral detections were associated with worsening of symptom scores from a participant's baseline. Patients with PAD had an increased odds ratio (OR) for pathogen detection, particularly viral detection (OR, 2.73; 95% CI, 2.09-3.57), specifically human rhinovirus (OR, 3.60; 95% CI, 2.53-5.13) and parainfluenza (OR, 3.06; 95% CI, 1.25-7.50). Haemophilus influenzae and Streptococcus pneumoniae were also detected more frequently in PAD. Exposure to young children, IgM deficiency, and the presence of bronchiectasis were independent risk factors for viral detection. Prophylactic antibiotic use was associated with a lower risk of bacterial detection by PCR. Conclusions: Patients with PAD have a significant respiratory symptom burden associated with increased viral infection frequency despite immunoglobulin replacement and prophylactic antibiotic use. This highlights a clear need for future therapeutic trials in the PAD population and informs future study design.
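
    For readers unfamiliar with the odds ratios quoted above (e.g. OR 2.73, 95% CI 2.09-3.57 for viral detection), the sketch below shows how an odds ratio and a Wald 95% confidence interval can be computed from a simple 2x2 table of swab results. The counts are invented placeholders, and the study's actual analysis presumably accounted for repeated swabs per participant, so this is only a minimal sketch of the basic calculation.

```python
# Sketch: odds ratio with a Wald 95% CI from a 2x2 table of swab results.
# Counts are invented placeholders, not the BIPAD study data.
import math

# rows: PAD patients / controls; columns: pathogen detected / not detected
a, b = 120, 600   # PAD:     detected, not detected
c, d = 50, 700    # control: detected, not detected

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```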

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
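
    The trial allocated donors 1:1:1 to inter-donation intervals using a computer-based algorithm. The details of that algorithm are not given here, so the sketch below is only a generic illustration of how a permuted-block 1:1:1 allocation might be implemented; the function name, block size, and seed are assumptions made for the example.

```python
# Illustrative permuted-block 1:1:1 randomisation sketch.
# The INTERVAL trial used its own computer-based algorithm; this is only a
# generic example of allocating donors to inter-donation interval groups.
import random

def permuted_block_allocator(arms, block_size=6, seed=42):
    """Yield arm labels in randomly permuted blocks, balanced within each block."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    while True:
        block = arms * (block_size // len(arms))
        rng.shuffle(block)
        yield from block

male_arms = ["12-week", "10-week", "8-week"]   # standard vs shorter intervals
allocator = permuted_block_allocator(male_arms)

# Allocate the next nine male donors as they are enrolled.
for donor_id in range(1, 10):
    print(f"donor {donor_id:02d} -> {next(allocator)} group")
```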

    The state of the Martian climate

    The average annual surface air temperature (SAT) anomaly for 2016 for land stations north of 60°N was +2.0°C, relative to the 1981–2010 average value (Fig. 5.1). This marks a new high for the record starting in 1900, and is a significant increase over the previous highest value of +1.2°C, which was observed in 2007, 2011, and 2015. Average global annual temperatures also showed record values in 2015 and 2016. Currently, the Arctic is warming at more than twice the rate of lower latitudes.
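
    As a minimal sketch of the anomaly arithmetic used above (an anomaly is the observed value minus the mean over the 1981–2010 reference period), with placeholder numbers rather than the report's data:

```python
# Sketch: a surface air temperature (SAT) anomaly is the observed value minus
# the mean over a reference period (here 1981-2010). Values are placeholders.
baseline_1981_2010 = [-8.3, -8.1, -8.4, -8.0, -7.9]   # hypothetical annual means, degC
observed_2016 = -6.2                                   # hypothetical 2016 value, degC

anomaly = observed_2016 - sum(baseline_1981_2010) / len(baseline_1981_2010)
print(f"SAT anomaly: {anomaly:+.1f} degC")
```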

    Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors

    Background: The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments. Methods: The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed. Findings: Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] men, 9914 [51%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), and lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [–0·59 to −0·31] in women) and ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [–6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals) other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001). Interpretation: During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores. Funding: NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
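
    The "per week shorter inter-donation interval" effects quoted above are linear trends. As a minimal sketch of that idea only, the example below fits an ordinary least-squares slope of units collected per year against weeks shorter than the standard interval; the data points are invented placeholders, and the trial's actual intention-to-treat analysis was more involved.

```python
# Sketch: estimating a "per week shorter inter-donation interval" trend as an
# ordinary least-squares slope. Values below are invented placeholders.
import numpy as np

# weeks shorter than the standard interval (0, 2, 4 for the three male groups)
weeks_shorter = np.array([0, 0, 2, 2, 4, 4, 0, 2, 4, 4])
# hypothetical units of blood collected per year for each donor
units_per_year = np.array([3.9, 4.1, 4.4, 4.3, 4.8, 4.9, 4.0, 4.5, 4.7, 5.0])

slope, intercept = np.polyfit(weeks_shorter, units_per_year, 1)
print(f"~{slope:.2f} extra units per year per week shorter interval")
```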

    Attaching and effacing (A/E) lesion formation by enteropathogenic E. coli on human intestinal mucosa is dependent on non-LEE effectors

    Enteropathogenic E. coli (EPEC) is a human pathogen that causes acute and chronic pediatric diarrhea. The hallmark of EPEC infection is the formation of attaching and effacing (A/E) lesions in the intestinal epithelium. Formation of A/E lesions is mediated by genes located on the pathogenicity island locus of enterocyte effacement (LEE), which encode the adhesin intimin, a type III secretion system (T3SS) and six effectors, including the essential translocated intimin receptor (Tir). Seventeen additional effectors are encoded by genes located outside the LEE, in insertion elements and prophages. Here, using a stepwise approach, we generated an EPEC mutant lacking all of the effector genes (EPEC0), together with intermediate mutants. We show that EPEC0 contains a functional T3SS. An EPEC mutant expressing intimin but lacking all LEE effectors except Tir (EPEC1) was able to trigger robust actin polymerization in HeLa cells and in mucin-producing intestinal LS174T cells. However, EPEC1 was unable to form A/E lesions on human intestinal in vitro organ cultures (IVOC). Screening the intermediate mutants for genes involved in A/E lesion formation on IVOC revealed that strains lacking non-LEE effectors have only a marginal ability to form A/E lesions. Furthermore, we found that the Efa1/LifA proteins are important for efficient A/E lesion formation in EPEC strains lacking multiple effectors. Taken together, these results demonstrate the intricate relationships between T3SS effectors and the essential role that non-LEE effectors play in A/E lesion formation on mucosal surfaces.